Objective. Covert aspects of ongoing user mental states provide key context information for user-aware human-computer interaction. In this paper, we focus on the problem of estimating the vigilance of users using EEG and EOG signals.

Approach. To improve the feasibility and wearability of vigilance estimation devices for real-world applications, we adopt a novel electrode placement for forehead EOG and extract various eye movement features, which contain the principal information of traditional EOG. We explore the effects of EEG from different brain areas and combine EEG and forehead EOG to leverage their complementary characteristics for vigilance estimation. Considering that the vigilance of users is a dynamically changing process, because the intrinsic mental states of users involve temporal evolution, we introduce continuous conditional neural field and continuous conditional random field models to capture dynamic temporal dependency.

Main results. We propose a multimodal approach to estimating vigilance by combining EEG and forehead EOG and incorporating the temporal dependency of vigilance into model training. The experimental results demonstrate that modality fusion improves performance over either single modality, that EOG and EEG contain complementary information for vigilance estimation, and that the temporal dependency-based models enhance the performance of vigilance estimation. From the experimental results, we observe that theta and alpha frequency activities increase, while gamma frequency activities decrease, in drowsy states in contrast to awake states.

Significance. The forehead setup allows for the simultaneous collection of EEG and EOG, and achieves performance comparable to that of the temporal and posterior sites using only four shared electrodes.